Optimal Convergence Rates for the Orthogonal Greedy Algorithm

Authors

Abstract

We analyze the orthogonal greedy algorithm when applied to dictionaries $\mathbb{D}$ whose convex hull has small entropy. We show that if the metric entropy of the convex hull of $\mathbb{D}$ decays at a rate of $O\left(n^{-\frac{1}{2}-\alpha}\right)$ for $\alpha > 0$, then the orthogonal greedy algorithm converges at the same rate on the variation space of $\mathbb{D}$. This improves upon the well-known $O\left(n^{-\frac{1}{2}}\right)$ convergence rate in many cases, most notably for dictionaries corresponding to shallow neural networks. These results hold under no additional assumptions on the dictionary beyond the decay rate of the entropy of its convex hull. In addition, they are robust to noise in the target function and can be extended to convergence rates on the interpolation spaces between the variation norm and the $L^2$ norm. We verify empirically that the predicted rates are obtained for dictionaries corresponding to shallow neural networks with Heaviside activation functions in two dimensions. Finally, we show that these improved rates are sharp and prove a negative result showing that the iterates generated by the orthogonal greedy algorithm cannot in general be bounded in the variation norm.
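As a companion to the abstract, the following is a minimal sketch of the orthogonal greedy iteration over a finite dictionary, with placeholder names (`D`, `f`, `n_steps`); it illustrates the greedy-selection-plus-orthogonal-projection structure the paper analyzes, not the specific shallow-network dictionaries or the rate analysis.

```python
import numpy as np

def orthogonal_greedy(D, f, n_steps):
    """Orthogonal greedy algorithm over a finite dictionary.

    D       : (m, N) array whose columns are dictionary elements.
    f       : (m,) target vector.
    n_steps : number of greedy iterations n.
    Returns the selected indices and the n-term approximant.
    """
    residual = f.copy()
    selected = []
    approximant = np.zeros_like(f, dtype=float)
    for _ in range(n_steps):
        # Greedy selection: dictionary element most correlated with the residual.
        scores = np.abs(D.T @ residual)
        if selected:
            scores[selected] = -np.inf   # do not reselect previous elements
        k = int(np.argmax(scores))
        selected.append(k)
        # Orthogonal projection of f onto the span of the selected elements.
        basis = D[:, selected]
        coeffs, *_ = np.linalg.lstsq(basis, f, rcond=None)
        approximant = basis @ coeffs
        residual = f - approximant
    return selected, approximant
```

In this notation, the paper's main result says that when the metric entropy of the convex hull of the dictionary decays like $O\left(n^{-\frac{1}{2}-\alpha}\right)$, the error of the $n$-term approximant above decays at the same rate for targets in the variation space, rather than at the classical $O\left(n^{-\frac{1}{2}}\right)$ rate.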


Similar Articles

Convergence Rates for Greedy Kaczmarz Algorithms

We discuss greedy and approximate greedy selection rules within Kaczmarz algorithms for solving linear systems. We show that in some applications the costs of greedy and randomized rules are similar, and that greedy selection gives faster convergence rates. Further, we give a multi-step analysis of a particular greedy rule showing it can be much faster when many rows are orthogonal.
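For illustration, here is a minimal sketch of a Kaczmarz solver with one common greedy selection rule (largest normalized residual); the specific rule, the fixed iteration count, and the absence of a stopping test are assumptions of the sketch, not details from the cited paper.

```python
import numpy as np

def greedy_kaczmarz(A, b, n_iters):
    """Kaczmarz iteration with a greedy (maximum normalized residual) row rule.

    A : (m, n) matrix of a consistent linear system A x = b.
    b : (m,) right-hand side.
    """
    m, n = A.shape
    x = np.zeros(n)
    row_norms_sq = np.sum(A * A, axis=1)
    for _ in range(n_iters):
        residual = b - A @ x
        # Greedy rule: pick the row with the largest normalized residual,
        # instead of sampling a row at random as in randomized Kaczmarz.
        i = int(np.argmax(residual**2 / row_norms_sq))
        # Project x onto the hyperplane {y : <A[i], y> = b[i]}.
        x = x + (residual[i] / row_norms_sq[i]) * A[i]
    return x
```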


Convergence Rates of the POD-Greedy Method

Iterative approximation algorithms are successfully applied in parametric approximation tasks. In particular, reduced basis methods make use of the so-called Greedy algorithm for approximating solution sets of parametrized partial differential equations. Recently, a-priori convergence rate statements for this algorithm have been given (Buffa et al. 2009, Binev et al. 2010). The goal of the curre...
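As a rough illustration of the Greedy algorithm referred to above, the sketch below selects snapshots over a finite parameter training set; `solve` and `error_indicator` are hypothetical placeholders for a high-fidelity solver and an error estimator, and the Gram-Schmidt step is only one common variant.

```python
import numpy as np

def reduced_basis_greedy(parameters, solve, error_indicator, n_basis):
    """Greedy selection of a reduced basis over a finite list of parameters.

    parameters      : list of parameter values mu.
    solve(mu)       : returns the high-fidelity solution snapshot for mu (1-D array).
    error_indicator : error_indicator(mu, basis) estimates the approximation error
                      of the current reduced space at mu.
    """
    basis = []
    for _ in range(n_basis):
        # Pick the parameter where the current reduced space is worst.
        errors = [error_indicator(mu, basis) for mu in parameters]
        mu_star = parameters[int(np.argmax(errors))]
        snapshot = solve(mu_star)
        # Orthonormalize the new snapshot against the current basis.
        for v in basis:
            snapshot = snapshot - (v @ snapshot) * v
        norm = np.linalg.norm(snapshot)
        if norm > 1e-12:
            basis.append(snapshot / norm)
    return basis
```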


The Algorithm for Solving the Inverse Numerical Range Problem

The numerical range of a square matrix $a$ is denoted by $w(a)$ and defined as $w(a) = \{x^{*}ax : x \in S_1\}$, where $S_1$ is the unit sphere. In 2009, Russell Carden posed the inverse numerical range problem as follows: given a point $z \in w(a)$, find a vector $x \in S_1$ such that $z = x^{*}ax$. In this thesis, we present an algorithm for solving the inverse numerical range problem.
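The forward map behind this problem is easy to state in code. The sketch below only evaluates $x^{*}ax$ and checks a candidate solution; it does not reproduce the thesis's algorithm, which is not described in this snippet, and the example matrix is arbitrary.

```python
import numpy as np

def rayleigh_quotient(A, x):
    """Return x* A x for a unit vector x, i.e. a point of the numerical range w(A)."""
    x = x / np.linalg.norm(x)          # restrict to the unit sphere S_1
    return np.vdot(x, A @ x)           # vdot conjugates the first argument

def is_inverse_solution(A, x, z, tol=1e-8):
    """Check whether the unit vector x solves the inverse problem x* A x = z."""
    return abs(rayleigh_quotient(A, x) - z) < tol

# Every value produced by rayleigh_quotient lies in w(A) by definition,
# so the pair (x, z) below is a valid instance of the inverse problem.
A = np.array([[2.0, 1.0], [0.0, 1.0j]])
x = np.random.randn(2) + 1j * np.random.randn(2)
z = rayleigh_quotient(A, x)
print(z, is_inverse_solution(A, x, z))
```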


Threshold Estimation via Group Orthogonal Greedy Algorithm



A Vectorial Kernel Orthogonal Greedy Algorithm

This work is concerned with the derivation and analysis of a modified vectorial kernel orthogonal greedy algorithm (VKOGA) for approximation of nonlinear vectorial functions. The algorithm pursues simultaneous approximation of all vector components over a shared linear subspace of the underlying function Hilbert space in a greedy fashion [16, 37] and inherits the selection principle of the f/P-Gre...
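To convey the flavor of such a method, here is a hedged sketch of a vectorial kernel greedy loop in which all output components share the greedily selected centers; the Gaussian kernel, the residual-based (f-greedy-style) selection rule, and the regularization constant are assumptions of the sketch, not details from the cited work.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    """Gaussian kernel matrix between point sets X (m, d) and Y (n, d)."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def vectorial_kernel_greedy(X, F, n_centers, gamma=1.0, reg=1e-10):
    """Greedily select shared centers and fit all output components on them.

    X : (m, d) sample points, F : (m, q) vector-valued samples.
    All q components share the same greedily chosen centers (a shared subspace).
    """
    centers = []
    residual = F.copy()
    coeffs = None
    for _ in range(n_centers):
        # Selection: the sample with the largest residual norm across components.
        k = int(np.argmax(np.linalg.norm(residual, axis=1)))
        centers.append(k)
        # Fit all components jointly on the current centers (regularized solve).
        Kc = gaussian_kernel(X[centers], X[centers], gamma)
        coeffs = np.linalg.solve(Kc + reg * np.eye(len(centers)), F[centers])
        residual = F - gaussian_kernel(X, X[centers], gamma) @ coeffs
    return X[centers], coeffs
```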



Journal

Journal title: IEEE Transactions on Information Theory

Year: 2022

ISSN: 0018-9448, 1557-9654

DOI: https://doi.org/10.1109/tit.2022.3147984